Sparse Bayesian kernel logistic regression
Authors
Abstract
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on MacKay's evidence approximation. The model is re-parameterised such that an isotropic Gaussian prior over parameters in the kernel-induced feature space is replaced by an isotropic Gaussian prior over the transformed parameters, facilitating a Bayesian analysis using standard methods. The Bayesian approach allows "good" values for the usual regularisation and kernel parameters to be selected by maximising the marginal likelihood. Results obtained on a variety of benchmark datasets indicate that the Bayesian kernel logistic regression model is competitive, whilst having one fewer parameter to determine during model selection.
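The recipe in the abstract — an isotropic Gaussian prior over re-parameterised weights, with regularisation chosen by maximising the marginal likelihood — can be sketched in code. The following is a minimal illustrative implementation, not the paper's method: the symmetric square-root re-parameterisation `Phi = K^{1/2}`, the Laplace (evidence) approximation to the marginal likelihood, and all function names are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr_evidence(X, y, gamma, lam, iters=100):
    """MAP fit of re-parameterised KLR plus a Laplace log-evidence.

    Sketch only: f = Phi @ w with Phi = K^{1/2} and isotropic prior
    w ~ N(0, I / lam), mirroring the re-parameterisation idea in the
    abstract. Labels y must be in {0, 1}. Returns (w, Phi, log_evidence).
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    vals, vecs = np.linalg.eigh(K + 1e-8 * np.eye(n))          # jitter for stability
    Phi = (vecs * np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T  # symmetric K^{1/2}
    w = np.zeros(n)
    for _ in range(iters):                                     # Newton / IRLS
        f = Phi @ w
        p = 1.0 / (1.0 + np.exp(-f))
        g = Phi.T @ (p - y) + lam * w                          # penalised gradient
        H = Phi.T @ ((p * (1 - p))[:, None] * Phi) + lam * np.eye(n)
        step = np.linalg.solve(H, g)
        w -= step
        if np.max(np.abs(step)) < 1e-9:
            break
    f = Phi @ w
    nll = np.sum(np.logaddexp(0.0, -f) + (1.0 - y) * f)        # logistic neg. log-lik.
    E = nll + 0.5 * lam * (w @ w)                              # penalised objective
    p = 1.0 / (1.0 + np.exp(-f))
    H = Phi.T @ ((p * (1 - p))[:, None] * Phi) + lam * np.eye(n)
    _, logdetH = np.linalg.slogdet(H)
    # Laplace approximation to the log marginal likelihood (evidence)
    log_evidence = -E + 0.5 * n * np.log(lam) - 0.5 * logdetH
    return w, Phi, log_evidence
```

Model selection then amounts to evaluating `fit_klr_evidence` over a grid of `lam` (and `gamma`) values and keeping the maximiser of the returned log-evidence, rather than cross-validating.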
Similar references
The evidence framework applied to sparse kernel logistic regression
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on the evidence framework introduced by MacKay. The principal innovation lies in the re-parameterisation of the model such that the usual spherical Gaussian prior over the parameters in the kernel-induced feature space also corresponds to a spherical Gaussian prior over t...
A Gradient-based Forward Greedy Algorithm for Sparse Gaussian Process Regression
In this chapter, we present a gradient-based forward greedy method for sparse approximation of the Bayesian Gaussian process regression (GPR) model. In contrast to previous work, which is mostly based on various basis-vector selection strategies, we propose to construct, rather than select, a new basis vector at each iterative step. This idea was motivated by the well-known gradient boosting approach...
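For contrast with the construction idea described above, here is a hedged sketch of the simpler basis-vector *selection* baseline the snippet refers to: a matching-pursuit-style forward greedy fit for sparse kernel regression, where each step picks the kernel column most correlated with the current residual. All names and details are illustrative assumptions, not the cited paper's algorithm.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_sparse_fit(X, y, gamma, n_basis):
    """Forward greedy sparse kernel regression (matching-pursuit style).

    At each step, select the training point whose kernel column is most
    correlated with the current residual, then refit the coefficients by
    least squares on the selected columns. Returns (selected indices, coef).
    """
    K = rbf_kernel(X, X, gamma)
    col_norms = np.linalg.norm(K, axis=0)
    selected = []
    residual = y.astype(float).copy()
    coef = np.zeros(0)
    for _ in range(n_basis):
        scores = np.abs(K.T @ residual) / col_norms  # normalised correlation
        if selected:
            scores[selected] = -np.inf               # no repeated basis vectors
        selected.append(int(np.argmax(scores)))
        Ks = K[:, selected]
        coef, *_ = np.linalg.lstsq(Ks, y, rcond=None)
        residual = y - Ks @ coef
    return selected, coef
```

A gradient-based *construction* scheme, as proposed in the snippet, would instead optimise a new basis vector's location (and possibly width) by following the gradient of the fitting objective rather than restricting candidates to the training inputs.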
Bayesian Approximate Kernel Regression with Variable Selection
Nonlinear kernel regression models are often used in statistics and machine learning because they can be more accurate than linear models. Variable selection for kernel regression models is challenging partly because, unlike in the linear regression setting, there is no clear notion of an effect size for regression coefficients. In this paper, we propose a novel framework that provides an analog of the eff...
Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression
Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) offers state-of-the-art performance in sparse regression. As a popular and competent kernel function in machine learning, the conventional Gaussian kernel uses a single, shared kernel width across all basis functions, which implicitly imposes a basic...